Results 1 - 5 of 5
1.
Abdom Radiol (NY); 47(4): 1425-1434, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35099572

ABSTRACT

PURPOSE: To present a fully automated deep learning (DL)-based prostate cancer detection system for prostate MRI. METHODS: MRI scans from two institutions were used for algorithm training, validation, and testing. MRI-visible lesions were contoured by an experienced radiologist. All lesions were biopsied using MRI-TRUS guidance. Lesion masks and histopathological results were used as ground-truth labels to train UNet and AH-Net architectures for prostate cancer lesion detection and segmentation. The algorithm was trained to detect any prostate cancer of ISUP grade ≥ 1. Detection sensitivity, positive predictive value (PPV), and the mean number of false positive lesions per patient were used as performance metrics. RESULTS: 525 patients were included for training, validation, and testing of the algorithm. The dataset was split into training (n = 368, 70%), validation (n = 79, 15%), and test (n = 78, 15%) cohorts. Dice coefficients in the training and validation sets were 0.403 and 0.307, respectively, for the AH-Net model, compared to 0.372 and 0.287, respectively, for the UNet model. In the validation set, detection sensitivity was 70.9%, PPV was 35.5%, and the mean number of false positive lesions per patient was 1.41 (range 0-6) for the UNet model, compared to a detection sensitivity of 74.4%, a PPV of 47.8%, and a mean of 0.87 (range 0-5) false positive lesions per patient for the AH-Net model. In the test set, detection sensitivity was 72.8% for UNet compared to 63.0% for AH-Net, and the mean number of false positive lesions per patient was 1.90 (range 0-7) and 1.40 (range 0-6) for the UNet and AH-Net models, respectively. CONCLUSION: We developed a DL-based AI approach that predicts prostate cancer lesions on biparametric MRI with reasonable performance metrics. While false positive lesion calls remain a challenge for AI-assisted detection algorithms, this system can be used as an adjunct tool by radiologists.
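The lesion-level metrics reported above (detection sensitivity, PPV, and mean false positives per patient) reduce to simple counting once each predicted lesion has been matched against the ground-truth contours. A minimal Python sketch is shown below; the data structures, the function name, and the assumption that matching has already been performed are illustrative and not taken from the paper.

```python
def lesion_detection_metrics(matched_predictions, lesions_per_patient):
    """Lesion-level detection sensitivity, PPV, and mean false positives per patient.

    matched_predictions: dict patient_id -> list of booleans, one per predicted
        lesion (True if it matched a ground-truth lesion).
    lesions_per_patient: dict patient_id -> number of annotated lesions.
    Both inputs are illustrative; the matching rule itself is not shown here.
    """
    tp = fp = fn = 0
    fp_counts = []
    for pid, n_gt in lesions_per_patient.items():
        hits = matched_predictions.get(pid, [])
        patient_tp = sum(hits)                  # predictions that hit a true lesion
        patient_fp = len(hits) - patient_tp     # predictions with no matching lesion
        tp += patient_tp
        fp += patient_fp
        fn += max(n_gt - patient_tp, 0)         # annotated lesions the model missed
        fp_counts.append(patient_fp)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    ppv = tp / (tp + fp) if tp + fp else 0.0
    mean_fp = sum(fp_counts) / len(fp_counts) if fp_counts else 0.0
    return sensitivity, ppv, mean_fp
```

For example, lesion_detection_metrics({"p1": [True, False]}, {"p1": 2}) returns (0.5, 0.5, 1.0): one of two annotated lesions found, one of two predictions correct, and one false positive for the single patient.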


Subject(s)
Deep Learning , Prostatic Neoplasms , Artificial Intelligence , Humans , Magnetic Resonance Imaging/methods , Male , Prostate/pathology , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/pathology
2.
Acad Radiol; 29(8): 1159-1168, 2022 Aug.
Article in English | MEDLINE | ID: mdl-34598869

ABSTRACT

RATIONALE AND OBJECTIVES: Prostate MRI improves detection of clinically significant prostate cancer; however, its diagnostic performance varies widely. Artificial intelligence (AI) has the potential to assist radiologists in the detection and classification of prostatic lesions. Herein, we aimed to develop and test a cascaded deep learning detection and classification system, trained on biparametric prostate MRI using PI-RADS, to assist radiologists during prostate MRI readout. MATERIALS AND METHODS: T2-weighted and diffusion-weighted (ADC maps, high b-value DWI) MRI scans obtained at 3 Tesla from two institutions (n = 1043 in-house and n = 347 Prostate-X, respectively), acquired between 2015 and 2019, were used for model training, validation, and testing. All scans were retrospectively reevaluated by one radiologist. Suspicious lesions were contoured and assigned a PI-RADS category. A 3D U-Net-based deep neural network was trained for automated detection and segmentation of prostate MRI lesions. Two 3D residual neural networks were used for a 4-class classification task to predict PI-RADS categories 2 to 5 and benign prostatic hyperplasia (BPH). Training and validation used 89% of the data (n = 1290 scans) with 5-fold cross-validation; the remaining 11% (n = 150 scans) were used for independent testing. Algorithm performance at the lesion level was assessed using sensitivity, positive predictive value (PPV), false discovery rate (FDR), classification accuracy, and Dice similarity coefficient (DSC). An additional analysis compared the AI algorithm's lesion detection performance with targeted biopsy results. RESULTS: In the in-house cohort, median age was 66 years (IQR 60-71) and median PSA was 6.7 ng/ml (IQR 4.7-9.9). In the independent test set, the algorithm correctly detected 111 of 198 lesions, yielding a sensitivity of 56.1% (49.3%-62.6%). PPV was 62.7% (95% CI 54.7%-70.7%), with an FDR of 37.3% (95% CI 29.3%-45.3%). Of 79 true positive lesions, 82.3% were tumor positive at targeted biopsy, whereas of 57 false negative lesions, 50.9% were benign at targeted biopsy. Median DSC for lesion segmentation was 0.359. Overall PI-RADS classification accuracy was 30.8% (95% CI 24.6%-37.8%). CONCLUSION: Our cascaded U-Net and residual network architecture can detect and classify cancer-suspicious lesions on prostate MRI with good detection performance and reasonable classification performance.
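Both this study and the previous one summarize segmentation overlap with the Dice similarity coefficient. A minimal NumPy version of the textbook formula 2|A ∩ B| / (|A| + |B|) is sketched below; the authors' own implementation is not published, so this is only the standard definition.

```python
import numpy as np

def dice_coefficient(pred_mask: np.ndarray, gt_mask: np.ndarray, eps: float = 1e-7) -> float:
    """Dice similarity coefficient between two binary masks of the same shape:
    2 * |A intersect B| / (|A| + |B|). eps guards against empty masks."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + gt.sum() + eps))
```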


Subject(s)
Deep Learning , Prostatic Neoplasms , Aged , Algorithms , Artificial Intelligence , Humans , Magnetic Resonance Imaging , Male , Prostate/diagnostic imaging , Prostate/pathology , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/pathology , Retrospective Studies
3.
Acad Radiol; 28(5): 664-670, 2021 May.
Article in English | MEDLINE | ID: mdl-32307270

ABSTRACT

INTRODUCTION: The aim of this study was to perform a quantitative assessment of prostate anatomy with a focus on the relation of prostatic urethral anatomic variation to urinary symptoms. METHODS: This retrospective study involved patients undergoing magnetic resonance imaging for prostate cancer who were also assessed for lower urinary tract symptoms. Volumetric segmentations were used to derive the in vivo prostatic urethral length and the urethral trajectory in the coronal and sagittal planes, using a piecewise cubic spline function to derive the angle of the urethra within the prostate. The association of anatomical factors with urinary symptoms was evaluated using ordinal univariable and multivariable logistic regression, with IPSS score cutoffs of ≤7, 8-19, and >20 to define mild, moderate, and severe symptoms, respectively. RESULTS: A total of 423 patients were included. On univariable analysis, whole prostate volume, transition zone volume, prostatic urethral length, urethral angle, and retrourethral volume were all significantly associated with worse urinary symptoms. On multivariable analysis, only prostatic urethral length remained associated with urinary symptoms, with a normalized odds ratio of 1.5 (95% confidence interval 1.0-2.2, p = 0.04). In a subset analysis of patients on alpha blockers, maximal urethral angle, transition zone volume, and prostatic urethral length were all associated with worse urinary symptoms. CONCLUSION: Multiple parameters were associated with worse urinary symptoms on univariable analysis, but only prostatic urethral length remained associated on multivariable analysis. This study demonstrates the ability of quantitative assessment of prostatic urethral anatomy to predict lower urinary tract symptoms.
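As one illustration of the spline-based measurements described above, the sketch below fits a piecewise cubic spline to ordered urethral centerline points and derives an arc length and a bend angle. The centerline extraction, the exact angle definition, and the sampling density are assumptions for illustration; the study's own code is not published.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def urethral_length_and_angle(centerline_pts: np.ndarray, n_samples: int = 200):
    """centerline_pts: ordered (N, 3) array of urethral centerline coordinates in mm,
    e.g. extracted from a volumetric segmentation (hypothetical input format).

    Returns the spline arc length and a bend angle in degrees, taken at the interior
    point of maximal deflection; this is one plausible angle definition, not
    necessarily the one used in the study."""
    t = np.linspace(0.0, 1.0, len(centerline_pts))
    spline = CubicSpline(t, centerline_pts, axis=0)      # piecewise cubic fit
    pts = spline(np.linspace(0.0, 1.0, n_samples))

    # Arc length: sum of chord lengths between densely sampled spline points.
    length = float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())

    # At each interior point, the angle between the vectors to the two endpoints is
    # 180 degrees for a straight urethra; the bend angle is its complement.
    v1 = pts[0] - pts[1:-1]
    v2 = pts[-1] - pts[1:-1]
    cosang = np.einsum("ij,ij->i", v1, v2) / (
        np.linalg.norm(v1, axis=1) * np.linalg.norm(v2, axis=1))
    bend = 180.0 - float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))).min())
    return length, bend
```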


Subject(s)
Lower Urinary Tract Symptoms , Prostatic Hyperplasia , Humans , Lower Urinary Tract Symptoms/diagnostic imaging , Magnetic Resonance Imaging , Male , Prostatic Hyperplasia/complications , Prostatic Hyperplasia/diagnostic imaging , Retrospective Studies , Urethra/diagnostic imaging
4.
AJR Am J Roentgenol; 215(6): 1403-1410, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33052737

ABSTRACT

OBJECTIVE. Deep learning applications in radiology often suffer from overfitting, limiting generalization to external centers. The objective of this study was to develop a high-quality prostate segmentation model capable of maintaining a high degree of performance across multiple independent datasets using transfer learning and data augmentation. MATERIALS AND METHODS. A retrospective cohort of 648 patients who underwent prostate MRI between February 2015 and November 2018 at a single center was used for training and validation. A deep learning approach combining 2D and 3D architectures and incorporating transfer learning was used for training. A data augmentation strategy was used that was specific to the deformations, intensity changes, and image-quality alterations seen in radiology images. Five independent datasets, four of which were from outside centers, were used for testing, which was conducted with and without fine-tuning of the original model. The Dice similarity coefficient was used to evaluate model performance. RESULTS. When the prostate segmentation models using transfer learning were applied to the internal validation cohort, the mean Dice similarity coefficient was 93.1 for whole prostate and 89.0 for transition zone segmentations. When the models were applied to the multiple test-set cohorts, the improvement in performance achieved using data augmentation alone was 2.2% for the whole prostate models and 3.0% for the transition zone segmentation models. However, the best test-set results were obtained with models fine-tuned on test center data, with mean Dice similarity coefficients of 91.5 for whole prostate segmentation and 89.7 for transition zone segmentation. CONCLUSION. Transfer learning allowed for the development of a high-performing prostate segmentation model, and data augmentation and fine-tuning approaches improved the performance of the prostate segmentation model when applied to datasets from external centers.
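The fine-tuning step that produced the best external-center results follows a common transfer learning pattern: start from the source-domain weights and adapt only part of the network on the new center's scans. The PyTorch sketch below uses a toy model; the architecture, the frozen-encoder choice, and the hyperparameters are assumptions for illustration, not the authors' published configuration.

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Toy stand-in for a prostate segmentation network, used only to show the
    fine-tuning pattern; the study's 2D/3D architecture is not reproduced here."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.decoder = nn.Sequential(
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinySegNet()

# Source-domain weights would normally be loaded from the original training run;
# an in-memory state dict stands in for that checkpoint here.
pretrained_state = TinySegNet().state_dict()
model.load_state_dict(pretrained_state)

# Fine-tune on the external center: freeze the encoder and adapt only the decoder.
for name, param in model.named_parameters():
    if name.startswith("encoder."):
        param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)
```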


Subject(s)
Magnetic Resonance Imaging , Pattern Recognition, Automated , Prostatic Neoplasms/diagnostic imaging , Datasets as Topic , Deep Learning , Humans , Male , Middle Aged , Retrospective Studies
5.
J Magn Reson Imaging; 52(5): 1499-1507, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32478955

ABSTRACT

BACKGROUND: The Prostate Imaging Reporting and Data System (PI-RADS) provides guidelines for risk stratification of lesions detected on multiparametric MRI (mpMRI) of the prostate but suffers from high intra- and interreader variability. PURPOSE: To develop an artificial intelligence (AI) solution for PI-RADS classification and compare its performance with that of an expert radiologist using targeted biopsy results. STUDY TYPE: Retrospective study including data from our institution and the publicly available ProstateX dataset. POPULATION: In all, 687 patients who underwent mpMRI of the prostate and had one or more detectable lesions (PI-RADS score >1) according to PI-RADSv2. FIELD STRENGTH/SEQUENCE: T2-weighted, diffusion-weighted imaging (DWI; five evenly spaced b values between b = 0 and 750 s/mm²) for apparent diffusion coefficient (ADC) mapping, high b-value DWI (b = 1500 or 2000 s/mm²), and dynamic contrast-enhanced T1-weighted series were obtained at 3.0T. ASSESSMENT: PI-RADS lesions were segmented by a radiologist. Bounding boxes around the T2/ADC/high-b-value segmentations were stacked and saved as JPEGs. These images were used to train a convolutional neural network (CNN). The PI-RADS scores obtained by the CNN were compared with the radiologist's scores. The cancer detection rate was measured in a subset of patients who underwent biopsy. STATISTICAL TESTS: Agreement between the AI-driven and radiologist-driven PI-RADS scores was assessed using a kappa score, and differences between categorical variables were assessed with a Wald test. RESULTS: For the 1034 detected lesions, the kappa score for the AI system vs. the expert radiologist was moderate, at 0.40. However, there was no significant difference in the rates of detection of clinically significant cancer for any PI-RADS score in the 86 patients who underwent targeted biopsy (P = 0.4-0.6). DATA CONCLUSION: We developed an AI system for assignment of a PI-RADS score to segmented lesions on mpMRI, with moderate agreement with an expert radiologist and a similar ability to detect clinically significant cancer. LEVEL OF EVIDENCE: 4. TECHNICAL EFFICACY STAGE: 2.
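Agreement between the AI-assigned and radiologist-assigned PI-RADS categories is summarized with a kappa score. A minimal example using scikit-learn's cohen_kappa_score is shown below; the scores are made-up placeholders, and the paper does not state whether an unweighted or weighted kappa was used.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-lesion PI-RADS categories; the study's own data are not public.
ai_scores          = [3, 4, 4, 2, 5, 3, 4, 2, 5, 3]
radiologist_scores = [3, 4, 5, 2, 5, 2, 4, 3, 5, 3]

# Unweighted Cohen's kappa: chance-corrected agreement on the categorical scores.
kappa = cohen_kappa_score(ai_scores, radiologist_scores)
print(f"kappa = {kappa:.2f}")
```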


Subject(s)
Deep Learning , Multiparametric Magnetic Resonance Imaging , Prostatic Neoplasms , Artificial Intelligence , Humans , Magnetic Resonance Imaging , Male , Prostatic Neoplasms/diagnostic imaging , Retrospective Studies